Approximate Expectation Propagation for Bayesian Inference on Large-scale Problems

Authors

  • Yuan Qi
  • Tommi S. Jaakkola
Abstract

where k indexes experimental replicates, i indexes the probe positions, j indexes the binding positions, and N(· | Σ_j a_{ji} s_j b_j, λ_i) denotes the probability density function of a Gaussian distribution with mean Σ_j a_{ji} s_j b_j and variance λ_i. We assign prior distributions to the binding event b_j and the binding strength s_j:

  p(b_j | π_j) = π_j^{b_j} (1 − π_j)^{1 − b_j}    (3)
  p_0(s_j) = Gamma(s_j | c_0, d_0)    (4)

where Gamma(· | c_0, d_0) denotes the probability density function of a Gamma distribution with hyperparameters c_0 and d_0. We assign a hyperprior distribution on the binding probability π_j:

  p_0(π_j) = Beta(π_j | α_0, β_0)    (5)
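As a minimal sketch, forward sampling from this hierarchical model can be written as follows. This is an illustration only, not the authors' code: the hyperparameter values, the weight matrix a_ji, the noise variances λ_i, and the Gamma shape/scale parameterization are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

J, I = 4, 6                # binding positions j, probe positions i
c0, d0 = 2.0, 1.0          # Gamma hyperparameters (assumed values)
alpha0, beta0 = 1.0, 1.0   # Beta hyperparameters (assumed values)
lam = 0.1 * np.ones(I)     # per-probe noise variances lambda_i (assumed)
A = rng.uniform(size=(J, I))  # influence weights a_ji (assumed)

# Hyperprior (5): binding probabilities pi_j ~ Beta(alpha0, beta0)
pi = rng.beta(alpha0, beta0, size=J)
# Prior (3): binding events b_j ~ Bernoulli(pi_j)
b = rng.binomial(1, pi)
# Prior (4): binding strengths s_j ~ Gamma(c0, d0)
# (shape/scale parameterization assumed here)
s = rng.gamma(c0, d0, size=J)
# Likelihood: y_i ~ N(sum_j a_ji * s_j * b_j, lambda_i)
mean = A.T @ (s * b)
y = rng.normal(mean, np.sqrt(lam))
```

Sampling in this top-down order (π_j, then b_j and s_j, then y) mirrors the factorization of the joint distribution implied by equations (3)-(5).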


Related articles

Stochastic Expectation Propagation

Expectation propagation (EP) is a deterministic approximation algorithm that is often used to perform approximate Bayesian parameter learning. EP approximates the full intractable posterior distribution through a set of local approximations that are iteratively refined for each datapoint. EP can offer analytic and computational advantages over other approximations, such as Variational Inference...


Data Integration for Classification Problems Employing Gaussian Process Priors

By adopting Gaussian process priors a fully Bayesian solution to the problem of integrating possibly heterogeneous data sets within a classification setting is presented. Approximate inference schemes employing Variational & Expectation Propagation based methods are developed and rigorously assessed. We demonstrate our approach to integrating multiple data sets on a large scale protein fold pre...


Deep Gaussian Processes for Regression using Approximate Expectation Propagation

Deep Gaussian processes (DGPs) are multi-layer hierarchical generalisations of Gaussian processes (GPs) and are formally equivalent to neural networks with multiple, infinitely wide hidden layers. DGPs are nonparametric probabilistic models and as such are arguably more flexible, have a greater capacity to generalise, and provide better calibrated uncertainty estimates than alternative deep mod...


Expectation propagation for large scale Bayesian inference of non-linear molecular networks from perturbation data

Inferring the structure of molecular networks from time series protein or gene expression data provides valuable information about the complex biological processes of the cell. Causal network structure inference has been approached using different methods in the past. Most causal network inference techniques, such as Dynamic Bayesian Networks and ordinary differential equations, are limited by ...


Online Spike-and-slab Inference with Stochastic Expectation Propagation

We present OLSS, an online algorithm for Bayesian spike-and-slab model inference, based on the recently proposed stochastic Expectation Propagation (SEP) framework [7]. We use a fully factorized form to efficiently process high dimensional features; further, we extend the SEP framework by incorporating multiple approximate average likelihoods, each of which corresponds to a cluster of samples (...



Journal:

Volume   Issue

Pages  -

Publication date: 2005